NEXT19 | James Bridle | New Dark Age: Is Technology Making the World Harder to Understand?
Summary
TLDR: The speaker, a technologist with a background in computer science, expresses concern over current technology trends, describing the moment as a 'New Dark Age.' They discuss the metaphorical 'cloud,' emphasizing its physical reality and societal impact. Drawing parallels between technology and weather, the speaker touches on the historical connection between computation and weather prediction. They critique the opacity of modern tech, its role in climate change, and societal manipulation, citing examples like Amazon warehouses and YouTube's algorithm. The talk concludes with potential maneuvers for positive tech use, emphasizing the importance of understanding and ethical technology design.
Takeaways
- 🌐 The term 'cloud' is a metaphor for vast data centers that power our digital interactions, yet its invisibility and mystery are part of its essence.
- 💡 The historical connection between weather prediction and the development of early computers like ENIAC highlights the intertwined destinies of computation and our understanding of natural systems.
- 📈 Moore's Law, which predicts the doubling of processing power every two years, has influenced our expectations of technological progress, yet it also masks the environmental and computational challenges we face.
- 🔍 The increasing opacity of technology, from algorithmic biases to the hidden labor behind apps, is leading to a lack of public understanding and control, fostering confusion and societal issues.
- 🌡️ Climate change is making weather patterns less predictable, challenging our reliance on historical data for future predictions and underscoring the limitations of technology in addressing global issues.
- 🛍️ The convenience of modern technologies often obscures the human labor and environmental costs, as seen in 'dark kitchens' and Amazon's warehouse practices.
- 🤖 Automation bias, where we overly trust technology, can lead to dangerous outcomes, as shown in studies where even trained professionals follow flawed automated advice.
- 🌍 The Internet and associated technologies are significant, often overlooked, contributors to climate change, challenging the sustainability of our digital practices.
- 🔒 The way technology is designed into society enables surveillance and control, as exemplified by Amazon's warehouse systems and the potential for social media to manipulate behavior.
- 🛑 The talk concludes with a call to action to rethink technology not for solutions per se, but for maneuvers that can redirect its impact towards more ethical, understandable, and democratic uses.
Q & A
What is the title of the talk and what does it signify?
-The title of the talk is 'New Dark Age,' which signifies the speaker's deep concern about the trends in technology and their impact on society, despite coming from a background that loves technology.
What is the significance of the term 'cloud' in the context of the talk?
-In the talk, the term 'cloud' is used to describe the misconception of it being a magical, mysterious place where data is stored and processed. It is emphasized that the cloud is actually physical infrastructure owned by companies and has real-world impacts.
How is the history of weather prediction related to the development of computation?
-The history of weather prediction is intrinsically linked to computation as it was one of the first applications that required massive data processing. Lewis Fry Richardson's work in the 1920s laid the foundation for using mathematics and data to predict the weather, which later required computational power that led to the development of early computers like the ENIAC.
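Richardson's grid method, the ancestor of modern numerical weather prediction, can be sketched in a few lines. This is a minimal illustration, not Richardson's actual equations: it divides space into discrete boxes, takes "measurements," and steps the field forward in time with a simple 1-D diffusion update standing in for the far richer atmospheric dynamics he solved by hand.

```python
# Minimal sketch of grid-based prediction: divide the world into
# discrete boxes, take measurements, then compute the future step by
# step. The 1-D diffusion update is an illustrative stand-in for the
# real atmospheric equations Richardson worked through on paper.

def step(temps, alpha=0.1):
    """One time step: each interior cell relaxes toward its neighbours."""
    new = temps[:]
    for i in range(1, len(temps) - 1):
        new[i] = temps[i] + alpha * (temps[i - 1] - 2 * temps[i] + temps[i + 1])
    return new

# Hypothetical "measurements" over Western Europe, one value per grid box.
field = [15.0, 18.0, 21.0, 17.0, 14.0]
for _ in range(10):  # ten time steps make up one short "forecast"
    field = step(field)
```

Richardson computed one such forecast by hand over about three months; a modern machine runs millions of far larger updates per second, which is why the first ENIAC forecast finally outran the weather itself.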
What is Moore's Law and how does it relate to the talk?
-Moore's Law is the observation that processing power doubles approximately every two years. The talk discusses how this law has influenced our expectations of technological progress and the belief in the potential of computers to solve any problem, but also points out its limitations and the issues arising from an overreliance on computational power.
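Moore's Law as a rule of thumb is just compound doubling: after t years, processing power has grown by a factor of roughly 2^(t/2). A one-function sketch (illustrative only, not a law of physics):

```python
def moores_law_factor(years, doubling_period=2):
    """Relative growth in processing power after `years` years,
    assuming one doubling every `doubling_period` years."""
    return 2 ** (years / doubling_period)

factor_20y = moores_law_factor(20)  # 2**10, roughly a thousandfold
```

Two decades of doubling compounds to about a thousandfold increase, which is exactly the expectation of limitless growth that, as the talk puts it, has "got inside our heads."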
What is 'Eroom's Law' and how does it contrast with Moore's Law?
-Eroom's Law ('Moore' spelled backwards) describes the observation that drug discovery has become slower and more expensive over recent decades despite ever-growing computational power. It highlights the failure of sheer computational power to solve complex problems like drug discovery.
How does the speaker connect technology to climate change?
-The speaker connects technology to climate change by pointing out that the electricity required for data processing and AI is a significant driver of climate change, and that technology, including the internet, contributes to environmental issues.
What is 'clear air turbulence' and how is it related to climate change?
-Clear air turbulence is a type of atmospheric disturbance that is unpredictable and is becoming more prevalent due to climate change. It is mentioned in the talk to illustrate how our ability to predict weather patterns is being compromised by the changing climate.
What is 'automation bias' and how does it relate to technology?
-Automation bias is a psychological phenomenon where people tend to overly trust automated systems, even when they provide incorrect information. The talk discusses how this bias can lead to dangerous outcomes, as people are becoming overly reliant on technology and losing their ability to think critically.
What is the 'Keeling curve' and what does it illustrate?
-The Keeling curve is a graph that shows the exponential growth of CO2 in the atmosphere, measured at the Mauna Loa Observatory in Hawaii. It illustrates the increasing levels of CO2, surpassing 400 parts per million, which is a critical marker for climate change discussions.
How does the speaker discuss the impact of technology on labor and society?
-The speaker discusses the impact of technology on labor and society by giving examples such as Amazon warehouses, where workers are monitored and directed by algorithms, leading to de-skilling and a lack of autonomy. This, along with other examples like dark kitchens and Pokemon Go, shows how technology is used to hide labor and manipulate consumers.
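The 'chaotic storage' scheme described for Amazon warehouses can be sketched as a simple location index: items are stowed wherever a bin is free, and software, not human memory or alphabetical order, resolves where everything sits. The toy model below is a hedged illustration; the class and method names are invented, not Amazon's actual system.

```python
import random

class ChaoticWarehouse:
    """Toy chaotic storage: items land in arbitrary free bins and only
    the index knows where they are -- the worker's device just follows it."""

    def __init__(self, num_bins):
        self.free_bins = list(range(num_bins))
        self.locations = {}  # item -> bin id

    def stow(self, item):
        """Place the item in a random free bin and record it in the index."""
        bin_id = self.free_bins.pop(random.randrange(len(self.free_bins)))
        self.locations[item] = bin_id
        return bin_id

    def pick(self, item):
        """Look up the item's bin, hand it to the picker, free the bin."""
        bin_id = self.locations.pop(item)
        self.free_bins.append(bin_id)
        return bin_id

wh = ChaoticWarehouse(num_bins=8)
for item in ["book", "dvd", "bath products"]:
    wh.stow(item)
shelf = wh.pick("dvd")  # only the system ever knew which bin this was
```

Because no shelf has a human-readable order, navigating the warehouse without the device is impossible, which is precisely the de-skilling and total-monitoring side effect the talk describes.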
What are the 'three small maneuvers' the speaker suggests as a way to counter the negative impacts of technology?
-The 'three small maneuvers' suggested by the speaker include using technology to produce different outcomes, repurposing technology for beneficial purposes, and insisting on transparency and understanding in technology design to ensure it serves democratic and ethical values.
Outlines
🌧️ The Concept of a 'New Dark Age' in Technology
The speaker, a writer, visual artist, and computer scientist, introduces the concept of a 'new Dark Age' in technology, expressing concern about current technological trends despite a background in computer science. The term 'cloud' is dissected, revealing its physical nature as large buildings filled with computers owned by companies with real-world impacts. The speaker emphasizes the importance of understanding technology as an extension of ourselves and society, drawing parallels between technology and natural systems, and highlighting the historical connection between weather prediction and computation. The talk aims to explore the origins of the 'new Dark Age' phrase and its implications on our perception and interaction with technology.
🌐 The Illusion and Reality of the Cloud and Moore's Law
The speaker discusses the cloud as a metaphor for technology's intangible nature, contrasting its ethereal public perception with its physical reality as massive data centers. The historical development of weather prediction and its reliance on computational power is explored, from Lewis Fry Richardson's manual calculations during World War I to the first computerized weather forecasts using the ENIAC. The speaker then transitions to Moore's Law, which predicts the doubling of processing power every two years, and its influence on our expectations of technological progress. However, the speaker introduces 'Eroom's Law' as a counterpoint, illustrating how increased computational power has not led to proportional gains in drug discovery, suggesting limitations in our reliance on technology for problem-solving.
🌡️ Climate Change and the Unpredictability of Modern Technology
The speaker addresses the impact of climate change on the unpredictability of weather, particularly clear air turbulence, and how it challenges our reliance on historical data for future predictions. The speaker also discusses the role of technology, including the internet and big data processing, as significant contributors to climate change. The 'Keeling Curve' is mentioned to illustrate the exponential growth of CO2 in the atmosphere, and the speaker connects high indoor CO2 levels to decreased cognitive function, suggesting that our technological advancements may be inadvertently hindering our ability to address these challenges.
🤖 Automation Bias and the Hidden Consequences of Technology
The speaker delves into the psychological phenomenon of automation bias, where people tend to trust technology even when it leads to incorrect decisions. Examples such as 'death by GPS' and the over-reliance on automated systems in various fields are given to illustrate the dangers of this bias. The speaker also discusses the societal impacts of technology, such as the dehumanizing effects of Amazon's warehouse system and the exploitation of workers in 'dark kitchens' for food delivery apps. The paragraph highlights how technology is designed to hide the labor and human costs behind the convenience it provides.
🕹️ The Manipulation of Reality and Public Opinion Through Technology
The speaker explores how technology is used to manipulate public opinion and reality, citing examples like Pokémon Go, which directs users to advertiser locations, and YouTube's autoplay and recommendation algorithms, which can radicalize viewers by promoting sensational or false content. The speaker also discusses the role of technology in politics, such as the Internet Research Agency's disinformation campaigns aimed at sowing confusion and mistrust. The paragraph emphasizes the ease with which people can be deceived by technology and the broader societal effects of these manipulations.
🛰️ Redirecting Technology for Positive Change
The speaker concludes with a discussion on how technology can be repurposed for positive outcomes. Examples include London delivery drivers using an app to organize a strike and the repurposing of US spy satellites into scientific instruments for space exploration. The speaker argues that technology itself is not the problem but rather how it is used and designed. The speaker advocates for a return to transparent, educative, and ethical technology that empowers rather than confuses or controls people, emphasizing the importance of understanding and participating in the technologies we use.
Mindmap
Keywords
💡New Dark Age
💡Cloud
💡Technology
💡Moore's Law
💡Eroom's Law
💡Climate Change
💡Automation Bias
💡Dark Kitchens
💡Algorithm
💡Disinformation
💡Opacity
Highlights
The speaker identifies as a technologist with a background in computer science, expressing concern about current technology trends.
The term 'new Dark Age' is introduced to describe the potential negative trajectory of technology's impact on society.
The 'cloud' is discussed as a metaphor for the intangibility and mystery surrounding modern technology.
The historical connection between weather prediction and the development of early computers is highlighted.
Lewis Fry Richardson's work in numerical weather prediction is cited as a foundational moment in computational science.
The ENIAC computer's role in both weather prediction and atomic bomb development is noted.
Moore's Law and its influence on expectations of computational power and societal progress are critiqued.
Eroom's Law is introduced as a counterpoint to Moore's Law, illustrating the diminishing returns in drug discovery.
The unpredictability of weather due to climate change challenges the traditional methods of data-based forecasting.
The Internet's significant contribution to climate change is discussed, contrasting its role as a tool for potential solutions.
The 'death by GPS' phenomenon illustrates the dangers of over-reliance on technology.
Automation bias is described, explaining how people tend to trust technology even when it leads to incorrect decisions.
The Amazon warehouse example is used to illustrate the dehumanizing effects of technology on labor.
Dark kitchens are presented as an example of how technology hides the labor behind convenience.
Pokemon GO is criticized for manipulating users' movements for advertising purposes without their knowledge.
YouTube's autoplay and recommendation algorithms are criticized for promoting misinformation and radical views.
The Ashley Madison data leak is used as an example of how easily people can be deceived by technology.
The Internet Research Agency's tactics to spread disinformation and confusion online are discussed.
The potential for technology to generate fake realities, such as deepfakes, is highlighted as a threat to trust.
The speaker argues against 'solutionism', suggesting that technology should be used to produce different outcomes.
The repurposing of a US spy satellite for scientific research is given as an example of positive technological redirection.
The speaker concludes by emphasizing the importance of understanding and ethical use of technology, rather than fearing it.
Transcripts
I am unlike I think most of the people
who've appeared on the stage so far
today as someone who works for a
technology company or has something to
sell in fact I'm a writer I'm a
visual artist I'm a computer scientist
by training I consider myself to be a
technologist a kind of vague word that
means I'm just really really interested
in this stuff the title of this talk is
new Dark Age which sounds grim because
it is because I find myself despite
coming from as I say a background in
computer science a background as an
internet hippie a kind of joyous nerd
someone who loves technology to be
deeply concerned about many of the
trends we see in technology and thus in
the world today and I'm going to talk a
little bit about where that phrase comes
from and what I mean by it I'm
interested in the way that we talk about
technology primarily because I think
when we talk about technology we're
usually really just talking about us and
our society and the way that we
interact with one another and so
an interesting place to start with this
idea of a new Dark Age is with the cloud
which is such a fascinating term to me
we hear this talk of the cloud as though
it's some sort of magical mysterious
faraway place where stuff just happens
you know magically and beautifully we
upload our photos we tell it our secrets
we talk to our friends through it we
give it a lot of money but it's sort of
nebulous and invisible and yet it's
always really important to remember that
the cloud is actually very solid it's
huge buildings filled with computers
that are owned by companies that exist
within legal jurisdictions within
particular geographies that have an
impact on the world in in many ways that
we'll talk about but then also I always
want to insist that
that name still tells us something
important about how we interact with it
that it remains sort of cloudy and its
cloudiness the very uncertainty that it
brings is super important and I like
this image of technology and the weather
relating it to natural
systems I found looking at that and
particularly in its history to be super
productive in thinking about these
things because there's always been you
know at least for the last century a
very interesting and tight relationship
between our ideas about the weather and
and computation itself this is a few
pages from one of my favorite books this
is Lewis Fry Richardson's 1922 text
Weather Prediction by Numerical Process
Richardson was one of the first
scientists and mathematicians to argue that
it would be possible to predict the weather
through data this at the time was a
super radical idea and no one believed
that the natural environment was
kind of susceptible to mathematics in
this way but Richardson proposed a
method which really came to define all
of computation which is that if you
divide the world up into discrete boxes
take certain measurements produce data
you can then compute that data and
predict the future Richardson did this
he wrote a book on it before he wrote
the book he actually did a full weather
calculation he took all of this data for
Western Europe and he worked out point
by point what the weather would be like
but this was before computers he did
this with pen and paper and it took him
about three months to do a single daily
forecast he also did it under shellfire
because he was an ambulance driver in
the first world war at the time it was
an astonishing achievement but he didn't
really imagine that this would actually
be effective because he didn't foresee
computers in fact it took another 30 odd
years before this was transformed into a
computer program this is the same or
very similar weather forecast performed
on a computer and it's in fact the first
24-hour forecast that ran quicker than
24 hours because we finally had
computational power that would keep up
with the actual weather and this is the
computer that was performed on this is
the ENIAC one of the very earliest
computers that was built in the US
during the Second World War
ENIAC one of my absolute favorite
computers I could nerd out about this as long as
you want and it's a beautiful thing it
took up two whole rooms here at the
Aberdeen Proving Grounds in Maryland and
it was invented basically to do two
things
to predict the weather and to build
atomic bombs that's basically where
computers come from from weather
prediction and from building atomic
weapons and they contain that history
within them to this day there's a
beautiful little story by one of the
engineers who first worked on the
ENIAC a guy called Harry Reid in his
kind of a farewell address he says this
beautiful thing where he said working on
the ENIAC was kind of like living inside
the computer because it completely
contained you it was a very personal
relationship and now we think of a
personal computer as something very
small that we carry around with us at
all times but actually that's not really
true this room sized computer didn't
shrink down at all rather it expanded
and now it includes the whole of the
planet it even goes up into outer
space in the form of satellites we all
live inside that computer that Harry
Reid and others envisioned in the 1950s
and it affects every aspect of our lives
it also really affects the way we kind
of view the world and have expectations
of it in the future this is a graph that
many of you are I'm sure familiar with
Moore's law the rule of thumb
developed in the 1960s that said
that processing power would
double every two years and amazingly it
has held true ever since Gordon Moore
the guy who came up with this at
Intel it was just a rule of thumb
it was just something he observed he
didn't really expect it to last and yet
it has and it's kind of got inside our
heads it's produced a kind of idea of
the world that if we only have more
computers and more processing power will
basically be able to achieve anything
we'll always have this fuel for total
expansion and that's meant to be a
slightly worrying phrase for the fuel
for continuous and ever-growing
expansion is something that's actually
causing quite severe problems for us in
the present here's a graph that goes the
other way so not Moore's law going
always up and to the right
but this is a graph of something that
people in the pharmacological sciences
working on the development of new
drugs coined Eroom's law that's Moore's
law backwards because this is a
discovery they've made that in fact over
the last 20-30 years as more and more
computers have been thrown at the
problem of drug discovery the results
have actually gotten worse we're discovering
that the larger and larger datasets and
more and more powerful computation are
not actually helping us with forms of
discovery that we need it's actually
getting harder and harder to sort
through this data even with the newest
tools purely using computational methods
what's fascinating about this is that
many pharmacological companies are now
actually changing their practices so
they don't just rely on kind of massive
data sets and powerful computation alone
but actually start to return to the idea
of having small teams of scientists
working on hunches essentially working
in the field using their own human
experience in opposition to the purely
computational thinking and this failure
of computation alone to accurately
predict and assist us in the future is
being replicated everywhere in fact in
the very first thing that we set out to
measure and predict in the first place
which is the weather these are images of
turbulence in the North Atlantic from
the latest research papers the
atmosphere as I'm sure you're aware due
to climate change is warming not
predicted for some point
in the future but right
now these are graphs of the current
situation as the atmosphere warms air
masses in their behavior become less
predictable as a result huge areas of
atmosphere shear against each other
producing what's called clear air
turbulence clear air turbulence is
specifically the turbulence that comes
out of clear air it's totally
unpredictable and it's getting worse as
a result of climate change and we can't
predict it or the other weather that's
occurring because the only thing we have
to go on is past data which because of
climate change is no longer the case our
entire practice of using past data to
predict the future is starting to fail
because of climate change
also as a result of these technological
ideas we've inherited from the history
of Technology itself and in fact of
course technology is one of the main
drivers of climate change not just
generally in terms of the legacy
technologies of fossil fuels that we use
all the time but also in contemporary
technologies the Internet is a vast
driver of climate change in itself the
electricity required to do all the kinds
of big data or AI processing you might
be hearing about here today is a massive driver of
climate change equivalent at least to
the whole of the airline industry so
we're already contributing to this
through the networks that we're hoping
will kind of get us out of it
and we're not going to be able to think
about this at all much more clearly for
very long this is another graph going up
and to the right
this is co2 as measured at the Mauna
Loa Observatory in Hawaii it's called
the Keeling curve it shows the
exponential growth in co2 in the
atmosphere that's been ongoing for over
well forever essentially but
increasing exponentially in the last 50
years what this graph shows is that we
surpassed 400 parts per million in the
atmosphere a couple of years ago what it
doesn't show is that indoor co2
regularly passes a thousand parts per
million it's probably somewhere close to
that in this room now it's not a lot of
ventilation we're all breathing in here
over-over excuse me a thousand parts per
million human cognitive ability drops by
20% you're dumber by breathing in co2
and we're actively increasing the co2 in
the atmosphere it's getting harder and
harder to think and our technology is
effectively contributing to this at
present one of my favorite examples of
this is a phenomenon that park rangers in
the u.s. named death by GPS this is when
people have become so accustomed to
trusting in the technological systems
that they've been given that they follow
them wherever they go
they've had people die in the middle of
Death Valley because they've driven
their cars down dirt tracks following
that little bright line or examples of
people driving into rivers or into the
sea because the map shows that that's
where the road goes we've we've given so
much over to these computational systems
that we're losing
our ability to think for ourselves and
in case you think that's just something
that basically stupid people do it's
actually an example of something that's
quite well-known in psychological and
neurological studies
it's called automation bias it means
basically that we trust technology
even like deep
structures in our brain do this they've
put pilots inside highly sophisticated
simulators very well trained pilots
people with thousands of hours of flying
experience and they've put in
simulated emergencies in which the
pilots know exactly what they need to do
and then they put in some kind of
automated system that at a critical
moment suggests they do the wrong thing
and 90% of the time even highly trained
people follow the bad advice with
hideous consequences when given that
advice by an automated system that they
trust our brains basically like the easy
way out they're very easy to
short-circuit particularly with
technology and things in which we trust
of course not all of us even have the
choice about following these automated
systems because of the ways that we're
designing them into society this is a
picture of an Amazon warehouse which is
an absolutely kind of fascinating
infrastructure that displays some of the
qualities of the kind of spaces that I'm
talking about and this is one of the
places where you have an ENIAC-like
room of computation although you
wouldn't really know by looking at it
that you're looking at a computer but
you kind of are Amazon uses this really
extraordinary thing called chaotic
storage so what happens is basically if
you're a large e-commerce company and
you have millions if not billions of
things for sale people don't order them
alphabetically right they don't order
them nicely essentially they all do a
bunch of random stuff and if you have a
vast warehouse you don't want the people
picking those things to have to walk
like two kilometers this way to get that
thing and then two kilometers back this
way right you need to group them
according to how people actually order
them and they use an algorithm to do
this the chaotic storage algorithm which
means basically on these shelves you
might find a book next to a DVD next to
some cleaning products next to
bath products whatever it is the
algorithm has decided are likely to be
bought together the result of this is
that this space is completely
unnavigable to humans it looks like
chaos you have no chance of finding
something in this which is why employees
who work in these warehouses wear wrist
mounted devices that guide them like
GPSs around the space they're
completely automated by a machine the
side-effects of this of course are that
it's also possible to monitor the
employees totally to know how long their
lunch breaks are when they take toilet
breaks it's possible to de-skill the
workforce there's no longer any
incentive to educate your workforce when
all they have to do is follow a guide
like this and and it's also very
possible to monitor them and keep an eye
on possibly who's talking to each other
who's planning to unionize to have full
and utter total control over your
employees and everything outside that as
I say including the lack of any
necessity of educating your employees is
damaging to society more broadly as well
there's something very weird I think
that we're designing so many of the
technological tools we use every day to
effectively hide things from us right
these are these are what are known as
dark kitchens if you use a delivery app
in a big major city in the UK it's
things like Uber Eats and Deliveroo and
the demand has so far outstripped the
supply possible from the restaurants the
popular restaurants basically put up
these containers in car parks where you
have chefs working 12-hour shifts to
provide meals for the orders the
distance between this and the image that
we sell of technological convenience is
extraordinary I find it amazing that we
put so much effort into hiding the labor
and the work that actually goes in to
providing us the lives we want and of
course this isn't just the level of
delivery apps this is happening across
almost everything that we do for
everything that's made technologically
convenient something is hidden and it's
usually people who are worse off and
getting worse off because of these tools
the examples of this are just extraordinary
I am fascinated by the phenomenon of
Pokemon go
I'm sure there's some people here who
play this but I'm also highly aware that
most people who play it are not aware
that many of the locations that they are
taken to by the app have been sold to
advertisers so that when you go
to find your pokemon gym or the
you know high-value Pokemon you
suddenly find yourself deposited at the
front door of a fast-food restaurant or
a particular shop that's been personally
identified for you by the various
analytics that the company holds on you
and that it aggregates from everywhere
else so it's not just the
Amazon workers who are being directed
around step by step for your convenience
it's also millions and millions of people
playing alternative reality games who
have literally no idea that they're
being walked through the world and
guided through it by forces that they
have literally no idea about and this
has in other places absolutely
devastating societal effects one of the
places in which this plays out
incredibly clearly is in YouTube which
is frankly a cesspit of awful things but
one particularly unpleasant tactic
is autoplay and its suggestion
algorithm autoplay and YouTube's
suggestion algorithm are completely
uncoupled from any kind of idea of
societal value or ethics and this is
exhibit a this is Walter Cronkite
talking very sensibly very
straightforwardly about climate change
back in 1980 and these are the
suggestions that YouTube thinks that you
should watch next the succession of
videos attempting or claiming
to debunk climate change to say that
this is not real that this is not
something you should listen to this is the
suggestion from a vast corporation that
has millions and millions of people
following it too and shapes their
opinions and their thoughts in very real
ways and there's
something that's happening here that's
actually quite well-documented there's a
lot of papers on this basically what you
have is you have an algorithm that's
been optimized for people's attention
and that's it all it wants to do is to
keep you watching for longer and it's
discovered
by accident but very realistically that
what people want is kind of contrary
opinions what they want is sensation what
they want is the discovery that they
know something that other people don't
it's a very basic human desire and so uh
YouTube essentially radicalizes people
it essentially takes you on a journey
from what may start in a very innocuous
place or even a sensible scientific
place and deliberately moves you into a
place of increased political paranoia
and uncertainty and falsity and this is
this is not what the algorithm was
designed to do but it's what happens
when technology is decoupled from any
wider context or social ethics and the
problem is
we're so easy to do this to my favorite
example of this men let's say are very
easy to do this to I'll narrow it down
to that some of you may have heard of
Ashley Madison which was a dating
website for people who wanted to have
affairs a
few years ago they had a
huge data leak hackers got in they took
everything out they put it all on the
Internet it was very very embarrassing
for a lot of people but researchers went
through the data and they discovered
that though this was a website for men and women, it probably won't surprise you too much to hear that 90% of its users were men. In fact, when they looked at the accounts registered as female, they discovered that only about a thousand of those were active; the rest belonged to people who logged on once and, you know, walked quietly away, as they should have done. But those thousand accounts each sent tens of thousands of messages a day. They were completely automated, and this site was making millions and millions of dollars: it convinced millions and millions of men to have sexy conversations with bots, and they were paying for it. People are incredibly easy to trick in this way. Certain things work better than others, but this extends, as I say, across pretty much all of our social networks, and really therefore into politics and society broadly.
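The researchers' separation of automated accounts from human ones can be sketched in miniature. This is an invented illustration of the general idea, not their actual method: the account names, message counts, and the cutoff are all made up.

```python
# Invented message log: humans plausibly send tens of messages a day,
# while the bots in the story sent tens of thousands.
messages_per_day = {
    "user_a": 12,
    "user_b": 0,        # logged on once and walked quietly away
    "bot_001": 25_000,
    "bot_002": 41_000,
}

# Arbitrary cutoff: no human plausibly sends this many messages a day.
HUMAN_PLAUSIBLE_MAX = 500

bots = sorted(name for name, n in messages_per_day.items()
              if n > HUMAN_PLAUSIBLE_MAX)
print(bots)  # the automated accounts stand out immediately
```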
It's not just about trickery either; it's not just about confusing people, or getting them to do something, or changing their minds. A lot of the arguments around the role of technology that play out in the political sphere rest on this idea that we're going to, you know, change people's minds and make them do something they wouldn't do anyway. That's not always the intention, and this is really key: the intention is often just confusion itself. This is the building in St Petersburg that houses something called the Internet Research Agency, which is essentially a Russian government-funded disinformation machine.
Hundreds of people work there as essentially professional trolls, leaving comments on websites, sharing disinformation online, and so on and so forth. There's a really interesting interview with someone who worked at the Internet Research Agency who describes their strategy, which actually describes the Russian government's strategy in general, and probably that of a number of other states. They said, we realized a long time ago that it actually wasn't possible to change people's minds; what we want to do is muddy the waters, make the Internet so horrible that nobody sensible will want to have anything to do with it. They're just going to poison the discourse.
And that's very easy to do, because it's so easy to manipulate and be manipulated through these systems; and because they are, for most people, so hard to understand, so poorly designed, and hide so much from view, it's very hard for us to make informed decisions or to think clearly about what's actually happening on these systems. And we're doing this deliberately, right? We're continuing to do this all the time. I find it extraordinary that so much contemporary attention is paid to systems which are increasingly intended to confuse us further.
These are outputs from a neural network. This is from a presentation I saw a few weeks ago that showed how brilliantly a bunch of researchers could generate entirely new faces, fake people: none of these are photographs of actual people. These are all computer-generated images, which can then be masked on to film, or on to television, or into the news, to produce entirely fake realities. And this is what's coming down the line, if it's not, as I very much suspect, pretty much in operation already: the point at which trust breaks down across pretty much all frontiers. And for me this is really
critical to our understanding of the world today. When people don't understand how the things they use actually work, when they know that information is being deliberately withheld from them, that's deeply distressing. It undermines our sense of self and our sense of agency, and the result is confusion, fear and anger, which are, as I'm sure you'll agree, the dominant emotions of social and political life across most of our societies today. I think there's a concrete and causal relationship between the opacity of the technologies we use every day, the way those things are constructed, and the lack of general understanding and the lack of ethics that exist within them. We live in confused and fearful times, and it's producing the politics we have, and we're failing in our duty to provide the education and the understanding that would change that. So: I've talked about this
darkness quite a lot, and I don't have a lot of time left, and I don't want to go out on a complete bummer, because I do that all the time and it does my head in as much as I'm sure it does yours. So I've been trying to think about how to end this talk without it being completely awful before you have lunch. I'm really opposed to the idea of solutions; I'm opposed to solutionism in general, because for me it's part of the problem. The idea that, here's a problem, we'll build an app for that, is exactly the problem that we're in. So I don't talk about solutions, I don't talk about answers to this; I talk about maneuvers. And here are three small maneuvers, or three small stories, that possibly undo some of that other stuff.
The first one is how we can use the technologies we have to produce radically different outcomes, and my favorite example is this: a strike by delivery drivers in London. As I said earlier, when you're following that little device around, when you have no control over your direction and your contract, it can be very hard to create a union, to argue for better working conditions, to confront your boss. So what delivery drivers in London did was this: a few of them managed to get together on an online forum, they went to the company's offices, and they started using the app to order pizza to themselves there. So they got more and more drivers to come in, and they built a protest through the app, by using it to actually introduce them to other workers rather than alienating them from each other. Another
example is of repurposing or rethinking what it is we want these technologies to be doing, of literally turning them around. This is a US spy satellite. These are vast, incredible technologies of extraordinary power that are, like the original ENIAC computer, mostly designed to be pointed at us, to be used as weapons. But something really weird happened a few years ago, something I really loved. I imagine it happening a bit like this: at a conference or something, someone from the National Geospatial-Intelligence Agency, which is like the even-more-secret-than-the-NSA US spy satellite agency, sidled up to someone from NASA and said, do you want a couple of satellites? And it basically turned out that the National Geospatial-Intelligence Agency had two Hubble-quality space telescopes sitting on the shelf that they never used, which were clearly obsolete (they've clearly got something way better and more scary now), and they donated these two satellites to NASA. NASA basically said, yes, we'll take them, and they're currently repurposing one of them into something called the Wide Field Infrared Survey Telescope: basically a new, incredibly powerful space telescope that they're going to use to search for new galaxies, for new habitable worlds, for incredible scientific achievements. It's this beautiful image of taking a technology that was designed to be aimed down at us and just literally flipping it around, looking out, and seeing what we could discover instead, for all of our benefit. And finally, I want
to insist that none of this is about the technology itself. The technology itself is not inherently opaque; it's not inherently complex and impossible to understand; it's not inherently dangerous. And this has always been true, right? This is what I think was one of the first technologies of democracy. This is a thing called the kleroterion, a big slab of stone which used to stand in the ancient Agora of Athens in 300 BC, when they invented democracy, and it was a machine for administering democracy, in a beautiful way.
The entire suffrage, which to be clear was only free adult property-owning men (we can do a lot better), would come down, and those standing for office would insert little ID tags into the front of this stone. Someone would pour a set of balls down a tube, and according to the colors of those balls, black and white, the people whose ID tags corresponded would be put in charge. They actually had a system that was based on what's called sortition rather than elections: the people in charge were chosen by lot, which is also a thing that I totally think should come back. But my point is this: the technology that ran that democracy was something that stood in the middle of the market square, that was visible to everyone; anyone could come down and see it at work, and they could understand and fully participate in the system they were engaged in. It's a simple technology. Technology has got more complicated, but the values it holds don't change. We can think about the context of the things that we build, and we can work to make them educative, more just, and increasing of equality, rather than intending to confuse, to overcome, to predict, to take the place of people and essentially to remove their agency.
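The sortition mechanism just described, drawing officeholders by lot from those who put their tags forward, is simple enough to sketch in a few lines. The names and numbers below are purely illustrative, not historical:

```python
import random

def sortition(volunteers, seats, seed=None):
    # Draw `seats` distinct officeholders by lot from those who put
    # themselves forward -- the black-and-white balls of the
    # kleroterion, here a seeded pseudorandom sample.
    rng = random.Random(seed)
    return rng.sample(volunteers, seats)

# Illustrative numbers, not historical ones.
volunteers = [f"citizen-{i}" for i in range(500)]
council = sortition(volunteers, seats=50, seed=42)
print(len(council))  # 50 officeholders chosen by lot, not by vote
```

The whole mechanism fits in one visible step, which is the point: anyone watching it run can verify how they are governed.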
So please, everyone, think about working on things like space telescopes and kleroteria, and on projects like these that will actually assist us in getting out of the morass that we're presently in. Thank you very much.